On learning context-free and context-sensitive languages

Authors

  • Mikael Bodén
  • Janet Wiles
Abstract

The long short-term memory (LSTM) network is not the only neural network that can learn a context-sensitive language. Second-order sequential cascaded networks (SCNs) are able to induce, from a finite fragment of a context-sensitive language, a means of processing strings outside the training set. The dynamical behavior of the SCN is qualitatively distinct from that observed in LSTM networks. Differences in performance and dynamics are discussed.
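To make the architecture concrete, the following is a minimal sketch of the second-order (multiplicative) state update that SCN-style networks use, where the current input symbol modulates the state-to-state weight matrix through a 3-way weight tensor. The tensor shapes, initialization, and symbol encoding are illustrative assumptions, not the authors' exact setup.

```python
import numpy as np

# Minimal sketch of a second-order (multiplicative) recurrent update:
# the current input symbol modulates the state-to-state weight matrix
# via a 3-way weight tensor. Shapes, initialization, and the symbol
# encoding are illustrative assumptions.
rng = np.random.default_rng(0)

n_state, n_input = 4, 3                                      # assumed sizes
W = rng.normal(scale=0.5, size=(n_state, n_state, n_input))  # 3-way weights

def step(state, x):
    # Contract W with the previous state and the current input,
    # giving an input-dependent linear map, then squash.
    return np.tanh(np.einsum('ijk,j,k->i', W, state, x))

state = np.full(n_state, 0.5)        # nonzero start so the state evolves
for symbol in [0, 0, 1, 2]:          # e.g. "aabc" as one-hot indices
    state = step(state, np.eye(n_input)[symbol])
print(state)
```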

Related articles

Critical Thinking Ability and Vocabulary Learning Strategy Use: The Case of EFL Learners in an ESL Context

Critical thinking is considered important in the field of education due to its possible effects on language learning. Therefore, the reasons behind the success and failure of language learners have provoked researchers to examine different aspects of the language learning process. Moreover, improving learners’ critical thinking ability in the course of learning will enable students to rely on t...

Learning Nonregular Languages: A Comparison of Simple Recurrent Networks and LSTM

In response to Rodriguez's recent article (2001), we compare the performance of simple recurrent nets and long short-term memory recurrent nets on context-free and context-sensitive languages.

LSTM recurrent networks learn simple context-free and context-sensitive languages

Previous work on learning regular languages from exemplary training sequences showed that long short-term memory (LSTM) outperforms traditional recurrent neural networks (RNNs). We demonstrate LSTM's superior performance on context-free language benchmarks for RNNs, and show that it works even better than previous hardwired or highly specialized architectures. To the best of our knowledge, LSTM ...
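As a concrete illustration of the kind of benchmark data such experiments use, here is a small sketch that generates a^n b^n c^n strings; the particular split between a short training fragment and longer unseen test strings is an assumption, not the paper's exact regime.

```python
# Hypothetical data generation for the a^n b^n c^n benchmark; the
# training/test ranges are assumptions, not the paper's exact regime.
def anbncn(n: int) -> str:
    return 'a' * n + 'b' * n + 'c' * n

train = [anbncn(n) for n in range(1, 11)]    # finite training fragment
test  = [anbncn(n) for n in range(11, 21)]   # longer, unseen strings
print(train[:3])   # ['abc', 'aabbcc', 'aaabbbccc']
```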

Using Contextual Representations to Efficiently Learn Context-Free Languages

We present a polynomial update time algorithm for the inductive inference of a large class of context-free languages using the paradigm of positive data and a membership oracle. We achieve this result by moving to a novel representation, called Contextual Binary Feature Grammars (CBFGs), which are capable of representing richly structured context-free languages as well as some context-sensitive...

Context-free and context-sensitive dynamics in recurrent neural networks

Continuous-valued recurrent neural networks can learn mechanisms for processing context-free languages. The dynamics of such networks is usually based on damped oscillation around fixed points in state space and requires that the dynamical components are arranged in certain ways. It is shown that qualitatively similar dynamics with similar constraints hold for a^n b^n c^n, a context-sensitive language....
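The counting mechanism around fixed points described above can be caricatured in one dimension: reading one symbol contracts the state toward a fixed point and reading the matching symbol expands it at the reciprocal rate, so the state returns to its start exactly when the counts agree. The rates below are assumptions chosen so that their product is 1, not values from the paper.

```python
# One-dimensional caricature of counting by matched contraction and
# expansion around a fixed point at 0; the rates 0.5 and 2.0 are
# assumptions chosen so their product is exactly 1.
def count_state(s: str, start: float = 1.0) -> float:
    state = start
    for ch in s:
        if ch == 'a':
            state *= 0.5    # contract toward the fixed point
        elif ch == 'b':
            state *= 2.0    # expand at the reciprocal rate
    return state

print(count_state('aaabbb'))   # 1.0 -> counts match
print(count_state('aaabb'))    # 0.5 -> one 'b' short
```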

Incremental training of first order recurrent neural networks to predict a context-sensitive language

In recent years it has been shown that first order recurrent neural networks trained by gradient-descent can learn not only regular but also simple context-free and context-sensitive languages. However, the success rate was generally low and severe instability issues were encountered. The present study examines the hypothesis that a combination of evolutionary hill climbing with incremental lea...
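For readers unfamiliar with the training scheme named here, the following is a minimal sketch of an evolutionary hill-climbing loop: mutate the current weights and keep the mutant only if it scores at least as well. The fitness function, mutation scale, and step count are placeholders, not the study's actual setup.

```python
import numpy as np

# Sketch of an evolutionary hill-climbing loop: mutate the weights,
# keep the mutant if it does at least as well. Fitness function,
# mutation scale, and step count are placeholders.
rng = np.random.default_rng(0)

def hill_climb(weights, fitness, steps=1000, sigma=0.05):
    best = fitness(weights)
    for _ in range(steps):
        candidate = weights + rng.normal(scale=sigma, size=weights.shape)
        score = fitness(candidate)
        if score >= best:           # accept neutral or improving moves
            weights, best = candidate, score
    return weights, best

# Toy usage: maximize -||w||^2 (optimum at the origin).
w, f = hill_climb(np.ones(5), lambda w: -np.sum(w * w))
print(round(f, 4))
```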

Journal:
  • IEEE Transactions on Neural Networks

Volume: 13  Issue: 2

Pages: -

Publication year: 2002